Cache-on-Demand: Recycling with Certainty
Authors

Abstract
Queries posed to a database usually access some common relations or share some common sub-expressions. In this paper, we examine the issue of caching using a novel framework called cache-on-demand (COD). COD views the intermediate/final answers of existing running queries as virtual caches that an incoming query can exploit. Those caches that are beneficial may then be materialized for the incoming query. Such an approach is essentially non-speculative: the exact cost of the investment and the return on the investment are known, and the cache is certain to be reused. We address several issues that must be resolved for COD to be realized. We also propose two optimizing strategies, Conform-COD and Scramble-COD, and evaluate their performance. Our results show that COD-based schemes can provide substantial performance improvement.
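To make the non-speculative decision concrete, here is a minimal sketch in Python (names such as VirtualCache and choose_caches are hypothetical, not from the paper): each running query exposes its intermediate results as virtual caches with a known materialization cost, and the incoming query materializes only those whose known reuse savings exceed that cost.

```python
from dataclasses import dataclass

@dataclass
class VirtualCache:
    """An intermediate result of a running query, not yet written out."""
    expr: str                # the sub-expression this result computes
    materialize_cost: float  # known cost of writing the result out
    reuse_saving: float      # known cost the incoming query avoids by reading it

def choose_caches(virtual_caches, incoming_subexprs):
    """Materialize exactly those virtual caches whose return on investment
    is positive for the incoming query: both cost and benefit are known,
    and reuse is certain, so nothing is cached on speculation."""
    return [vc for vc in virtual_caches
            if vc.expr in incoming_subexprs
            and vc.reuse_saving > vc.materialize_cost]

if __name__ == "__main__":
    running = [
        VirtualCache("R JOIN S", materialize_cost=40.0, reuse_saving=120.0),
        VirtualCache("T GROUP BY a", materialize_cost=90.0, reuse_saving=60.0),
    ]
    incoming = {"R JOIN S", "T GROUP BY a"}
    for vc in choose_caches(running, incoming):
        print("materialize:", vc.expr)  # only "R JOIN S" pays off
```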
Similar resources
An Ant Colony approach to forward-reverse logistics network design under demand certainty
Forward-reverse logistics network design has remained a subject of intensive research over the past few years. It is of significant importance in a supply chain because it affects the responsiveness of the supply chain. Real-world problems need to be formulated, and they usually involve objectives such as cost, quality, and customer responsiveness. For this reason, we ...
CloudCache: On-demand Flash Cache Management for Cloud Computing
Host-side flash caching has emerged as a promising solution to the scalability problem of virtual machine (VM) storage in cloud computing systems, but it still faces serious limitations in capacity and endurance. This paper presents CloudCache, an on-demand cache management solution to meet VM cache demands and minimize cache wear-out. First, to support on-demand cache allocation, the paper pro...
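The snippet is cut off before the allocation details, but the on-demand idea can be sketched roughly as follows (Python; the reuse-based demand estimate and proportional scaling are assumptions for illustration, not the paper's actual model): estimate each VM's cache demand from the blocks it actually reuses, then scale the shares down proportionally when the flash cache cannot cover the total demand.

```python
from collections import Counter

def cache_demand(trace):
    """Estimate a VM's cache demand as the number of distinct blocks it
    accesses more than once in the window: only reused data benefits
    from caching, so scans contribute no demand."""
    counts = Counter(trace)
    return sum(1 for c in counts.values() if c > 1)

def allocate(traces, flash_capacity):
    """On-demand allocation: give each VM its estimated demand if the
    flash cache is big enough, otherwise scale all shares proportionally."""
    demand = {vm: cache_demand(t) for vm, t in traces.items()}
    total = sum(demand.values())
    if total <= flash_capacity:
        return demand
    return {vm: d * flash_capacity // total for vm, d in demand.items()}

if __name__ == "__main__":
    traces = {
        "vm1": [1, 2, 1, 3, 2, 1],        # blocks 1 and 2 are reused
        "vm2": [10, 11, 12, 13, 14, 15],  # pure scan: no reuse, no demand
    }
    print(allocate(traces, flash_capacity=4))  # {'vm1': 2, 'vm2': 0}
```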
Phased Drowsy I-cache with On-demand Wakeup Prediction Policy for High-performance Low-energy Microprocessors
In this paper, we propose a phased drowsy instruction cache with an on-demand wakeup prediction policy (called the "phased on-demand policy") to reduce leakage and dynamic energy with less performance overhead. As in the prior non-phased on-demand policy, an extra wakeup stage is inserted before the fetch stage in the pipeline. The drowsy cache lines are woken up in the wakeup stage, and the wakeup latency...
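A toy model of the mechanism described above (Python; the one-cycle wakeup latency and the next-line predictor are assumptions, not the paper's design): every line idles in a low-leakage drowsy state, the wakeup stage wakes the line the predictor expects to be fetched next, and a misprediction stalls fetch for the wakeup latency.

```python
WAKEUP_LATENCY = 1  # cycles to restore a drowsy line (assumed)

def simulate(fetch_trace):
    """Count fetch stall cycles under a simple next-line wakeup predictor."""
    stalls = 0
    awake = None       # the single line currently kept awake
    predicted = None   # line pre-woken by the wakeup stage
    for line in fetch_trace:
        if line != predicted and line != awake:
            stalls += WAKEUP_LATENCY  # mispredicted: wake on demand
        awake = line
        predicted = line + 1          # sequential next-line prediction
    return stalls

if __name__ == "__main__":
    # Mostly sequential fetches with one taken branch (to line 20).
    trace = [0, 1, 2, 3, 20, 21, 22]
    print("stall cycles:", simulate(trace))  # stalls on lines 0 and 20
```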
Utility-Based Cache Partitioning
This paper investigates the problem of partitioning a shared cache between multiple concurrently executing applications. The commonly used LRU policy implicitly partitions a shared cache on a demand basis, giving more cache resources to the application that has a high demand and fewer cache resources to the application that has a low demand. However, a higher demand for cache resources does not...
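The utility-based alternative the title refers to can be sketched as a greedy marginal-gain allocation (Python; the snippet above does not give the paper's actual algorithm or monitoring hardware, so the curves and names below are illustrative): give each successive cache way to the application whose hit count grows the most from it, rather than to the application that merely demands the most.

```python
def partition(util, total_ways):
    """Greedy way allocation: repeatedly assign the next cache way to the
    application with the largest marginal hit gain.

    util[a][w] = hits application a gets with w ways (util[a][0] == 0);
    in hardware these curves would come from per-application monitors."""
    alloc = {a: 0 for a in util}
    for _ in range(total_ways):
        best = max(util, key=lambda a: util[a][alloc[a] + 1] - util[a][alloc[a]])
        alloc[best] += 1
    return alloc

if __name__ == "__main__":
    # "streaming" demands many ways but gains little from each;
    # "reuse" saturates quickly but earns most of the hits early.
    curves = {
        "streaming": [0, 5, 10, 15, 20, 25, 30, 35, 40],
        "reuse":     [0, 60, 90, 100, 102, 103, 103, 103, 103],
    }
    print(partition(curves, total_ways=8))  # {'streaming': 5, 'reuse': 3}
    # Ways follow marginal hit gain, not raw demand: "reuse" keeps the
    # three ways that matter instead of being squeezed out by "streaming".
```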
Reduction in Cache Memory Power Consumption based on Replacement Quantity
Today, power consumption is considered one of the most important design issues, so reducing it plays a considerable role in developing systems. Previous studies have shown that cache memories account for approximately 50% of total power consumption. There is a direct relationship between power consumption and the number of replacements made in the cache: the fewer the replacements, the less...
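The claimed relationship can be illustrated with a small LRU simulation (Python; the trace and capacity below are made up for illustration): counting replacements directly measures the quantity the paper ties to power consumption.

```python
from collections import OrderedDict

def count_replacements(trace, capacity):
    """Simulate a small fully-associative LRU cache and count how many
    block replacements it performs; under the paper's premise, fewer
    replacements mean less dynamic power spent rewriting cache lines."""
    cache = OrderedDict()
    replacements = 0
    for block in trace:
        if block in cache:
            cache.move_to_end(block)       # hit: refresh recency
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the LRU block
                replacements += 1
            cache[block] = True
    return replacements

if __name__ == "__main__":
    trace = [1, 2, 3, 1, 4, 5, 1, 2]
    print("replacements:", count_replacements(trace, capacity=3))  # 3
```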